A Scaled Conjugate Gradient Algorithm for Fast Supervised Learning
Authors
Abstract
Similar works
A scaled conjugate gradient algorithm for fast supervised learning
A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. The algorithm is based upon a class of optimization techniques well known in numerical analysis as the Conjugate Gradient Methods. SCG uses second order information from the neural network but requires only O(N) memory usage, where N is the number of weights in the network. The perf...
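The abstract's O(N) memory claim can be illustrated with a minimal sketch: conjugate-gradient training stores only a handful of length-N vectors, and the second-order information enters through a finite-difference Hessian-vector product, as in SCG. The code below is a hedged sketch applying that machinery to a toy quadratic, not the full algorithm (the Levenberg-Marquardt scaling of the step size is omitted); all names are illustrative.

```python
import numpy as np

# Sketch of the conjugate-gradient machinery SCG builds on, applied to
# a toy quadratic E(w) = 0.5 w^T A w - b^T w. Memory stays O(N): only a
# few length-N vectors (w, r, p, s) are ever held at once.

def grad(w, A, b):
    """Gradient of the toy quadratic objective."""
    return A @ w - b

def scg_sketch(A, b, n_iter=50, sigma=1e-4):
    n = len(b)
    w = np.zeros(n)
    r = -grad(w, A, b)        # residual = negative gradient
    p = r.copy()              # initial search direction
    for _ in range(n_iter):
        # Hessian-vector product approximated by finite differences,
        # the key SCG trick: s ~ (E'(w + sigma*p) - E'(w)) / sigma
        s = (grad(w + sigma * p, A, b) - grad(w, A, b)) / sigma
        delta = p @ s
        if delta <= 0:        # non-positive curvature: restart downhill
            p = r.copy()
            continue
        alpha = (p @ r) / delta           # step size along p
        w = w + alpha * p
        r_new = -grad(w, A, b)
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves update
        p = r_new + beta * p
        r = r_new
        if np.linalg.norm(r) < 1e-10:
            break
    return w

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
w = scg_sketch(A, b)          # converges to the minimizer A^{-1} b
```

On a quadratic the finite-difference product is exact up to rounding, so the sketch reduces to classical CG; on a neural-network error surface the same loop runs with the network's gradient routine in place of `grad`.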
Full text

A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions unrelated to any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
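The Wolfe line search mentioned in this abstract accepts a step size only when it yields both sufficient decrease and reduced curvature along the search direction. The following is a small sketch of that acceptance test under the standard (weak) Wolfe conditions; the function names and constants are illustrative, not taken from the paper.

```python
import numpy as np

# Hedged sketch of the weak Wolfe conditions used to accept a step
# size alpha in line-search conjugate gradient methods.

def wolfe_ok(f, g, x, p, alpha, c1=1e-4, c2=0.9):
    """Return True if alpha satisfies both Wolfe conditions at x along p."""
    gx_p = g(x) @ p                                      # directional derivative
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * gx_p
    curvature = g(x + alpha * p) @ p >= c2 * gx_p
    return armijo and curvature

# Toy quadratic f(x) = x^T x, steepest-descent direction from x = [1, 1]
f = lambda x: x @ x
g = lambda x: 2 * x
x = np.array([1.0, 1.0])
p = -g(x)
```

With this objective, `alpha = 0.5` (the exact line minimizer) passes both conditions, while a tiny step such as `alpha = 1e-6` fails the curvature condition, which is what forces the search to make meaningful progress.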
Full text

Backpropagation Learning for Multi-layer Feed-forward Neural Networks Using the Conjugate Gradient Method. IEEE Transactions on Neural Networks, 1991. [31] M. F. Møller. A Scaled Conjugate Gradient Algorithm for Fast Supervised Learning. Technical Report PB-339
Full text
An Efficient PCA-type Learning Based on Scaled Conjugate Gradient Algorithm for Fast Signal Subspace Decomposition
Nonlinear PCA-type learning has recently been suggested for signal subspace decomposition and sinusoidal frequency tracking, where it outperforms linear PCA based methods and traditional least squares algorithms. Currently, nonlinear PCA algorithms are directly generalized from linear ones based on the gradient descent (GD) technique. The convergence behavior of gradient descent is depende...
Full text
Journal
Journal title: DAIMI Report Series
Year: 1990
ISSN: 2245-9316, 0105-8517
DOI: 10.7146/dpb.v19i339.6570